Bootstrap aggregating : Wikipedia English edition
Bootstrap aggregating

Bootstrap aggregating, also called bagging, is a machine-learning ensemble meta-algorithm designed to improve the stability and accuracy of machine-learning algorithms used in statistical classification and regression. It reduces variance and helps to avoid overfitting. Although it is usually applied to decision-tree methods, it can be used with any type of method. Bagging is a special case of the model-averaging approach.
==Description of the technique==
Given a standard training set ''D'' of size ''n'', bagging generates ''m'' new training sets D_i, each of size ''n′'', by sampling from ''D'' uniformly and with replacement. Because the sampling is with replacement, some observations may be repeated within each D_i. If ''n′'' = ''n'', then for large ''n'' the set D_i is expected to contain the fraction (1 − 1/''e'') (≈63.2%) of the unique examples of ''D'', the rest being duplicates.〔Aslam, Javed A.; Popa, Raluca A.; and Rivest, Ronald L. (2007); ''On Estimating the Size and Confidence of a Statistical Audit'', Proceedings of the Electronic Voting Technology Workshop (EVT '07), Boston, MA, August 6, 2007. More generally, when drawing with replacement ''n′'' values out of a set of ''n'' (different and equally likely), the expected number of unique draws is n(1 − e^{−n′/n}).〕 This kind of sample is known as a bootstrap sample. The ''m'' models are then fitted using the ''m'' bootstrap samples and combined by averaging their outputs (for regression) or by majority voting (for classification).
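The sampling-and-combination procedure above can be sketched in plain Python. Here `fit_model` is a placeholder for any learning procedure that maps a training set to a predictor (an assumption for illustration, not a specific library API):

```python
import random
from collections import Counter

def bootstrap_sample(data, n_prime=None):
    """Draw n' examples from data uniformly, with replacement (n' = n by default)."""
    if n_prime is None:
        n_prime = len(data)
    return [random.choice(data) for _ in range(n_prime)]

def bagging_fit(data, fit_model, m):
    """Fit m models, each on its own bootstrap sample of the training set."""
    return [fit_model(bootstrap_sample(data)) for _ in range(m)]

def bagging_predict_regression(models, x):
    """Combine by averaging the m models' outputs."""
    return sum(model(x) for model in models) / len(models)

def bagging_predict_classification(models, x):
    """Combine by majority vote over the m models' outputs."""
    return Counter(model(x) for model in models).most_common(1)[0][0]

# With n' = n and large n, each bootstrap sample contains close to
# 1 - 1/e ~= 63.2% of the unique examples of D, the rest being duplicates.
random.seed(0)
D = list(range(100000))
frac_unique = len(set(bootstrap_sample(D))) / len(D)
print(round(frac_unique, 3))  # close to 0.632
```

The final `print` illustrates the (1 − 1/''e'') fraction empirically rather than proving it; the bound holds in expectation for large ''n''.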
Bagging leads to "improvements for unstable procedures" (Breiman, 1996), which include, for example, artificial neural networks, classification and regression trees, and subset selection in linear regression (Breiman, 1994). An application of bagging showing improvement in preimage learning is provided in the cited works.〔Sahu, A.; Runger, G.; Apley, D.; ''Image denoising with a multi-phase kernel principal component approach and an ensemble version'', IEEE Applied Imagery Pattern Recognition Workshop, pp. 1–7, 2011.〕〔Shinde, Amit; Sahu, Anshuman; Apley, Daniel; and Runger, George; "Preimages for Variation Patterns from Kernel PCA and Bagging", IIE Transactions, Vol. 46, Iss. 5, 2014.〕 On the other hand, it can mildly degrade the performance of stable methods such as ''k''-nearest neighbors (Breiman, 1996).
